Review




Structured Review

Open Geospatial Consortium sensor model language (sensorml)
Sensor Model Language (SensorML), supplied by the Open Geospatial Consortium, is used in various techniques. Bioz Stars score: 90/100, based on 1 PubMed citation. ZERO BIAS - scores, article reviews, protocol conditions and more
https://www.bioz.com/result/sensor model language (sensorml)/product/Open Geospatial Consortium
Average 90 stars, based on 1 article review
sensor model language (sensorml) - by Bioz Stars, 2026-04
90/100 stars

Similar Products

97
Welcony time s eeg device egi 128 geodesic sensor net eye tracker tobii pro glasses 3 t raw eeg pretrained language model texts eye tracking
Fig. 1 Overview of the experiment and the modalities included in the dataset. (a) Equipment utilized in the experiment, including the EGI device for collecting EEG data and the Tobii Pro Glasses 3 eye-tracker for tracking eye movements. (b) The experiment setup. Participants were instructed to sit quietly approximately 67 cm from the screen and sequentially read the highlighted text. (c) The experimental protocol. Participants’ 128-channel EEG signals and eye-tracking data were recorded while reading the highlighted text. (d) The data modalities in the dataset. The dataset comprises raw data such as the original textual stimuli, eye movement data, EEG data, and derivatives such as text embeddings from pre-trained NLP models and pre-processed EEG data.
Time S EEG Device EGI 128 Geodesic Sensor Net Eye Tracker Tobii Pro Glasses 3 T Raw EEG Pretrained Language Model Texts Eye Tracking, supplied by Welcony, is used in various techniques. Bioz Stars score: 97/100, based on 1 PubMed citation. ZERO BIAS - scores, article reviews, protocol conditions and more
https://www.bioz.com/result/time s eeg device egi 128 geodesic sensor net eye tracker tobii pro glasses 3 t raw eeg pretrained language model texts eye tracking/product/Welcony
Average 97 stars, based on 1 article review
time s eeg device egi 128 geodesic sensor net eye tracker tobii pro glasses 3 t raw eeg pretrained language model texts eye tracking - by Bioz Stars, 2026-04
97/100 stars

90
EPIRUS Inc smart computing models, sensors, and early diagnostic speech and language deficiencies indicators in child communication
Smart Computing Models, Sensors, And Early Diagnostic Speech And Language Deficiencies Indicators In Child Communication, supplied by EPIRUS Inc, is used in various techniques. Bioz Stars score: 90/100, based on 1 PubMed citation. ZERO BIAS - scores, article reviews, protocol conditions and more
https://www.bioz.com/result/smart computing models, sensors, and early diagnostic speech and language deficiencies indicators in child communication/product/EPIRUS Inc
Average 90 stars, based on 1 article review
smart computing models, sensors, and early diagnostic speech and language deficiencies indicators in child communication - by Bioz Stars, 2026-04
90/100 stars

90
Open Geospatial Consortium sensor model language (sensorml)
Sensor Model Language (SensorML), supplied by the Open Geospatial Consortium, is used in various techniques. Bioz Stars score: 90/100, based on 1 PubMed citation. ZERO BIAS - scores, article reviews, protocol conditions and more
https://www.bioz.com/result/sensor model language (sensorml)/product/Open Geospatial Consortium
Average 90 stars, based on 1 article review
sensor model language (sensorml) - by Bioz Stars, 2026-04
90/100 stars

90
Open Geospatial Consortium sensor model language (sml)
Sensor Model Language (SML), supplied by the Open Geospatial Consortium, is used in various techniques. Bioz Stars score: 90/100, based on 1 PubMed citation. ZERO BIAS - scores, article reviews, protocol conditions and more
https://www.bioz.com/result/sensor model language (sml)/product/Open Geospatial Consortium
Average 90 stars, based on 1 article review
sensor model language (sml) - by Bioz Stars, 2026-04
90/100 stars

90
Open Geospatial Consortium sensor model language
Sensor Model Language, supplied by the Open Geospatial Consortium, is used in various techniques. Bioz Stars score: 90/100, based on 1 PubMed citation. ZERO BIAS - scores, article reviews, protocol conditions and more
https://www.bioz.com/result/sensor model language/product/Open Geospatial Consortium
Average 90 stars, based on 1 article review
sensor model language - by Bioz Stars, 2026-04
90/100 stars

90
Open Geospatial Consortium sensor model language (sensorml) specification
Sensor Model Language (SensorML) Specification, supplied by the Open Geospatial Consortium, is used in various techniques. Bioz Stars score: 90/100, based on 1 PubMed citation. ZERO BIAS - scores, article reviews, protocol conditions and more
https://www.bioz.com/result/sensor model language (sensorml) specification/product/Open Geospatial Consortium
Average 90 stars, based on 1 article review
sensor model language (sensorml) specification - by Bioz Stars, 2026-04
90/100 stars

Image Search Results


Fig. 1 Overview of the experiment and the modalities included in the dataset. (a) Equipment utilized in the experiment, including the EGI device for collecting EEG data and the Tobii Pro Glasses 3 eye-tracker for tracking eye movements. (b) The experiment setup. Participants were instructed to sit quietly approximately 67 cm from the screen and sequentially read the highlighted text. (c) The experimental protocol. Participants’ 128-channel EEG signals and eye-tracking data were recorded while reading the highlighted text. (d) The data modalities in the dataset. The dataset comprises raw data such as the original textual stimuli, eye movement data, EEG data, and derivatives such as text embeddings from pre-trained NLP models and pre-processed EEG data.

Journal: Scientific data

Article Title: ChineseEEG: A Chinese Linguistic Corpora EEG Dataset for Semantic Alignment and Neural Decoding.

doi: 10.1038/s41597-024-03398-7


Article Snippet: Experiment setup and equipment: EEG device (EGI 128-channel geodesic sensor net); eye tracker (Tobii Pro Glasses 3). Data modalities: raw data (texts, eye-tracking data, raw EEG) and derivatives (text embeddings from a pretrained language model and pre-processed EEG), with temporal alignment between modalities.

Techniques:
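The derivative pipeline the Fig. 1 legend describes (text embeddings from a pretrained model, temporally aligned with pre-processed 128-channel EEG) can be sketched roughly as follows. This is a minimal illustration with dummy data; the sampling rate, epoch length, embedding dimension, and onset times are assumptions, not values taken from the dataset.

```python
import numpy as np

# Assumed parameters -- not taken from the dataset documentation.
SFREQ = 250          # assumed pre-processed EEG sampling rate, Hz
N_CHANNELS = 128     # 128-channel EGI geodesic sensor net

# Pre-processed EEG: channels x samples (10 s of dummy data here).
eeg = np.random.randn(N_CHANNELS, SFREQ * 10)

# One embedding vector per text unit from a pretrained language model
# (dimension 768 is an assumption, e.g. a BERT-base-like model).
embeddings = np.random.randn(5, 768)

# Assumed onset times (s) at which each text unit was highlighted.
onsets = np.array([0.5, 2.0, 3.5, 5.0, 7.5])

# Temporal alignment: map each unit's onset to an EEG sample index and
# slice a fixed-length 1 s epoch of EEG following it.
epoch_len = SFREQ
starts = (onsets * SFREQ).astype(int)
epochs = np.stack([eeg[:, s:s + epoch_len] for s in starts])

print(epochs.shape)      # (5, 128, 250): one EEG epoch per embedded unit
print(embeddings.shape)  # (5, 768): matching embedding per unit
```

Pairs of `(epochs[i], embeddings[i])` are then what a semantic-alignment or neural-decoding model would train on.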

Fig. 3 File structure of the dataset. (a) Eye-tracking data: Each experimental run is associated with a .rar file that contains eye-tracking data. (b) Electrode information files: These include detailed information of electrodes such as the location, type, and sampling rate, as well as information on any channels marked as bad during pre- processing. (c) EEG data and event-related files: Including EEG data in BrainVision format and event files that record marker information. (d) ICA-related files: Containing independent components in numpy format, records of removed components during pre-processing, and topographic maps of the components. (e) Text materials: Containing original and segmented text. (f) Text embedding files: Each file corresponds to an experimental run and is stored in .npy format. (g) Raw EEG data.

Journal: Scientific data

Article Title: ChineseEEG: A Chinese Linguistic Corpora EEG Dataset for Semantic Alignment and Neural Decoding.

doi: 10.1038/s41597-024-03398-7


Article Snippet: Experiment setup and equipment: EEG device (EGI 128-channel geodesic sensor net); eye tracker (Tobii Pro Glasses 3). Data modalities: raw data (texts, eye-tracking data, raw EEG) and derivatives (text embeddings from a pretrained language model and pre-processed EEG), with temporal alignment between modalities.

Techniques: Sampling, Marker
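The file formats the Fig. 3 legend names (.npy text embeddings per run, BrainVision EEG, numpy ICA components) can be accessed with standard Python tooling. The sketch below writes and reads back a dummy embedding file; the file name and array shape are illustrative assumptions, not the dataset's actual layout.

```python
import os
import tempfile
import numpy as np

# Fig. 3(f): text-embedding files are stored per experimental run in
# .npy format. The file name below is hypothetical.
tmpdir = tempfile.mkdtemp()
emb_path = os.path.join(tmpdir, "sub-01_run-1_embedding.npy")

np.save(emb_path, np.zeros((100, 768)))   # dummy: 100 units x 768 dims
emb = np.load(emb_path)
print(emb.shape)  # (100, 768)

# Fig. 3(c): EEG data are distributed in BrainVision format
# (.vhdr/.eeg/.vmrk), which MNE-Python can read, e.g.:
#   raw = mne.io.read_raw_brainvision("sub-01_run-1_eeg.vhdr")
# Fig. 3(d): ICA components are stored as numpy arrays and load the
# same way with np.load.
```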